# Dutch Pretrained

## RobBERT-2023 Dutch Large
License: MIT
RobBERT-2023 is a Dutch language model based on the RoBERTa architecture, developed by KU Leuven, Ghent University, and TU Berlin, and is one of the state-of-the-art language models for Dutch.
Tags: Large Language Model, Transformers, Other
Author: DTAI-KULeuven · Downloads: 627 · Likes: 20
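
Below is a minimal sketch of querying the model above for masked-token prediction with the Hugging Face transformers pipeline. The exact model ID `DTAI-KULeuven/robbert-2023-dutch-large` is assumed from the author and model name shown in this listing.

```python
# Minimal sketch (assumed model ID): masked-token prediction with RobBERT-2023.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="DTAI-KULeuven/robbert-2023-dutch-large")

# RobBERT follows the RoBERTa convention, so the mask token is "<mask>".
for prediction in fill_mask("Er staat een <mask> in mijn tuin."):
    print(prediction["token_str"], round(prediction["score"], 3))
```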
## RobBERT V2 Dutch NER
License: MIT
RobBERT is the state-of-the-art Dutch BERT model, pretrained at large scale and adaptable to various text tasks through fine-tuning; this entry is its named-entity-recognition (NER) variant.
Tags: Large Language Model, Other
Author: pdelobelle · Downloads: 76.94k · Likes: 3
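
A short sketch of running Dutch named-entity recognition with this checkpoint via the transformers token-classification pipeline. The model ID `pdelobelle/robbert-v2-dutch-ner` is assumed from the author and model name in this listing.

```python
# Minimal sketch (assumed model ID): Dutch NER with the fine-tuned RobBERT checkpoint.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="pdelobelle/robbert-v2-dutch-ner",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

for entity in ner("Jan werkt bij een bank in Brussel."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```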